# Abstractive summarization

## Lacia Sum Small V1
A Russian abstractive summarization model fine-tuned from d0rj/rut5-base-summ and optimized for Russian-language text (see the usage sketch below).
Tags: Text Generation, Transformers, Multilingual · Author: LaciaStudio · Downloads: 380 · Likes: 0

## Lacia Sum Small V1
A Russian automatic text summarization model fine-tuned from d0rj/rut5-base-summ, optimized for Russian text and supporting abstractive summary generation.
Tags: Text Generation, Transformers, Multilingual · Author: 2KKLabs · Downloads: 136 · Likes: 0

## BART No Extraction V2
A fine-tuned BART model for summarizing long legal documents, using a multi-stage summarization approach to handle complex legal texts (a generic sketch of that pattern follows below).
Tags: Text Generation, Transformers, English · Author: MikaSie · Downloads: 280 · Likes: 0

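Multi-stage summarization of long documents is usually implemented as: split the document into chunks that fit the model's context window, summarize each chunk, then summarize the concatenated partial summaries. The sketch below shows that generic pattern with the transformers pipeline; the repository id, chunk size, and length limits are illustrative assumptions, not values taken from this model's card.

```python
# Generic multi-stage (hierarchical) summarization sketch for long documents.
# The model id, chunk size, and length limits are illustrative assumptions.
from transformers import pipeline

summarizer = pipeline("summarization", model="MikaSie/<repo-name>")  # placeholder id

def chunk_text(text: str, max_words: int = 700) -> list[str]:
    """Split a long document into roughly max_words-sized word chunks."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def multi_stage_summary(document: str) -> str:
    # Stage 1: summarize each chunk independently.
    partials = [
        summarizer(chunk, max_length=200, min_length=50, truncation=True)[0]["summary_text"]
        for chunk in chunk_text(document)
    ]
    # Stage 2: summarize the concatenation of the partial summaries.
    return summarizer(" ".join(partials), max_length=300, min_length=80,
                      truncation=True)[0]["summary_text"]
```
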
## T5 Summarizer Model
A text summarization model fine-tuned from T5-small, designed to produce concise, coherent, and informative summaries of long texts.
License: MIT · Tags: Text Generation, Transformers, English · Author: KipperDev · Downloads: 25 · Likes: 1

## Dansumt5 Small
A Danish news summarization model based on Google's mT5 architecture, fine-tuned on the DaNewsroom dataset.
License: Apache-2.0 · Tags: Text Generation, Transformers, Other · Author: Danish-summarisation · Downloads: 32 · Likes: 0

## Ptt5 Base Summ Xlsum
A Brazilian Portuguese abstractive summarization model fine-tuned from PTT5, covering a range of text types including news.
License: MIT · Tags: Text Generation, Transformers, Other · Author: recogna-nlp · Downloads: 3,754 · Likes: 16

## Mbart Finetune En Cnn
An English abstractive summarization model fine-tuned from mBART-large-50 on the CNN/DailyMail dataset, well suited to producing concise summaries of news articles (see the sketch below).
Tags: Text Generation, Transformers, English · Author: eslamxm · Downloads: 19 · Likes: 0

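mBART-50 checkpoints attach language codes to the tokenizer, so a fine-tuned summarizer is typically driven as below. This is a hedged sketch: the repository id is a placeholder, it assumes the fine-tune kept the standard mBART-50 tokenizer, and the forced English BOS token may be unnecessary for a monolingual fine-tune.

```python
# Sketch for an mBART-50-based summarizer. Assumes the standard mBART-50
# tokenizer with language codes; the repo id is a placeholder.
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

model_id = "eslamxm/<repo-name>"  # placeholder, see the model card
tokenizer = MBart50TokenizerFast.from_pretrained(model_id, src_lang="en_XX", tgt_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained(model_id)

article = "Some long English news article ..."
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)

summary_ids = model.generate(
    **inputs,
    num_beams=4,
    max_length=142,  # CNN/DailyMail-style summary length
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("en_XX"),  # decode into English
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```
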
## Mt5 Base Finetuned Fa
A Persian summarization model based on google/mt5-base, fine-tuned on the pn_summary dataset.
License: Apache-2.0 · Tags: Text Generation, Transformers, Other · Author: ahmeddbahaa · Downloads: 20 · Likes: 1

## Arat5 Base Finetune Ar Xlsum
An abstractive summarization model based on AraT5-base, fine-tuned on the Arabic portion of the XL-Sum dataset.
Tags: Text Generation, Transformers, Arabic · Author: ahmeddbahaa · Downloads: 15 · Likes: 0

## T5 Arabic Base Finetuned Wikilingua Ar
An Arabic summarization model based on the T5 architecture, fine-tuned on the wiki_lingua dataset.
License: Apache-2.0 · Tags: Text Generation, Transformers, Arabic · Author: ahmeddbahaa · Downloads: 16 · Likes: 1

Mt5 Base Dacsa Es
This model is a fine-tuned version of the mT5 base model specifically for Spanish text summarization tasks, particularly suitable for generating summaries of news articles.
Text Generation Transformers Spanish
M
ELiRF
154
2
## Staging Pegasus Gmeetsamsum
PEGASUS is a Transformer-based model pre-trained specifically for abstractive summarization; its gap-sentence pre-training objective gives it strong performance across many summarization datasets (illustrated below).
Tags: Text Generation, Transformers, English · Author: kmfoda · Downloads: 14 · Likes: 0

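Gap-sentence generation (GSG) masks whole sentences of a document and trains the model to regenerate them, which closely mimics abstractive summarization. The sketch below is a deliberately simplified illustration of how such pre-training pairs are built: PEGASUS itself selects "principal" sentences by ROUGE scoring rather than at random, and the sentence splitting here is naive.

```python
# Simplified illustration of gap-sentence generation (GSG) pre-training pairs.
# PEGASUS picks "principal" sentences via ROUGE against the rest of the document;
# this sketch picks sentences at random and splits on periods only.
import random

MASK = "<mask_1>"  # sentence-level mask token used by PEGASUS

def make_gsg_pair(document: str, gap_ratio: float = 0.3) -> tuple[str, str]:
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    n_gaps = max(1, int(len(sentences) * gap_ratio))
    gap_idx = set(random.sample(range(len(sentences)), n_gaps))

    # Encoder input: the document with the selected sentences masked out.
    masked_input = ". ".join(
        MASK if i in gap_idx else s for i, s in enumerate(sentences)
    ) + "."
    # Decoder target: the masked sentences, i.e. a pseudo-summary.
    target = ". ".join(sentences[i] for i in sorted(gap_idx)) + "."
    return masked_input, target

src, tgt = make_gsg_pair(
    "PEGASUS is pre-trained with gap sentences. Whole sentences are masked. "
    "The decoder learns to regenerate them. This objective mimics summarization."
)
print(src)
print(tgt)
```
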
## Rut5 Base Absum
A Russian abstractive summarization model based on the T5 architecture, fine-tuned on multiple datasets and capable of generating concise and accurate summaries.
License: MIT · Tags: Text Generation, Transformers, Other · Author: cointegrated · Downloads: 1,135 · Likes: 27

## Mbart Summarization Fanpage
An Italian abstractive summarization model based on mBART-large, fine-tuned on the Fanpage dataset.
Tags: Text Generation, Transformers, Other · Author: ARTeLab · Downloads: 30 · Likes: 0

## Rugpt3medium Sum Gazeta
A Russian abstractive summarization model based on rugpt3medium_based_on_gpt2 (a GPT-2-style, decoder-only model), trained on the Gazeta dataset (see the sketch below).
License: Apache-2.0 · Tags: Text Generation, Transformers, Other · Author: IlyaGusev · Downloads: 1,228 · Likes: 4

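Because this checkpoint is decoder-only rather than encoder-decoder, summarization is plain left-to-right generation conditioned on the article. The sketch below is an assumption-laden illustration: the repository id is reconstructed from this listing, and the "article + separator" prompt format is a guess; the model card defines the exact format used during training.

```python
# Sketch for a decoder-only (GPT-2-style) summarizer. The repo id is
# reconstructed from the listing and the prompt format is an assumption;
# the model card defines the real input format.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "IlyaGusev/rugpt3medium_sum_gazeta"  # verify on the hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

article = "Текст новости, которую нужно кратко пересказать ..."
sep = tokenizer.sep_token or "\n"  # assumed separator between article and summary
input_ids = tokenizer(article + sep, return_tensors="pt").input_ids

output_ids = model.generate(
    input_ids,
    max_new_tokens=120,      # length budget for the generated summary
    no_repeat_ngram_size=4,  # curb repetition, common for GPT-style decoding
)
# Keep only the continuation after the prompt, i.e. the summary itself.
summary = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
print(summary)
```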